Over sampling ensemble algorithm based on margin theory
ZHANG Zongtang, CHEN Zhe, DAI Weiguo
Journal of Computer Applications    2019, 39 (5): 1364-1367.   DOI: 10.11772/j.issn.1001-9081.2018112346
Abstract
To address the problem that traditional ensemble algorithms are not suitable for imbalanced data classification, an Over Sampling AdaBoost based on Margin theory (MOSBoost) was proposed. Firstly, the margins of the original samples were obtained by pre-training. Then, the minority class samples were heuristically duplicated according to margin sorting to form a new balanced sample set. Finally, the balanced sample set was used as the input to train AdaBoost and obtain the final ensemble classifier. In experiments on UCI datasets, F-measure and G-mean were used to evaluate MOSBoost, AdaBoost, Random OverSampling AdaBoost (ROSBoost) and Random UnderSampling AdaBoost (RDSBoost). The experimental results show that MOSBoost is superior to the other three algorithms; compared with AdaBoost, MOSBoost improves F-measure and G-mean by 8.4% and 6.2% respectively.
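The abstract outlines a three-step procedure (pre-train to obtain margins, oversample the minority class by margin sorting, retrain AdaBoost on the balanced set). The following is a minimal sketch of that idea using scikit-learn's AdaBoostClassifier; the margin definition, the choice to duplicate the smallest-margin minority samples, and the function name `margin_oversample_boost` are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

def margin_oversample_boost(X, y, minority_label=1, random_state=0):
    # Step 1: pre-train AdaBoost on the original (imbalanced) data.
    pre = AdaBoostClassifier(n_estimators=50, random_state=random_state).fit(X, y)

    # Step 2: compute signed margins (positive when the pre-trained
    # ensemble classifies the sample correctly, for binary labels).
    sign = np.where(y == minority_label, 1.0, -1.0)
    margins = sign * pre.decision_function(X)

    minority_idx = np.where(y == minority_label)[0]
    majority_idx = np.where(y != minority_label)[0]
    deficit = len(majority_idx) - len(minority_idx)

    # Step 3: sort minority samples by margin and duplicate the
    # smallest-margin (hardest) ones until both classes are balanced.
    order = minority_idx[np.argsort(margins[minority_idx])]
    dup_idx = np.resize(order, deficit)  # cycles through sorted minority samples
    X_bal = np.vstack([X, X[dup_idx]])
    y_bal = np.concatenate([y, y[dup_idx]])

    # Step 4: train the final ensemble on the balanced sample set.
    return AdaBoostClassifier(n_estimators=50, random_state=random_state).fit(X_bal, y_bal)
```

Such a sketch could then be evaluated with F-measure and G-mean against plain AdaBoost and the random over-/under-sampling baselines mentioned in the abstract.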